Concentration behavior of the penalized least squares estimator
Authors
Abstract
Related resources
Penalized Least Squares and Penalized Likelihood
where pλ(·) is the penalty function. Best subset selection corresponds to pλ(t) = (λ/2)I(t ≠ 0). If we take pλ(t) = λ|t|, then (1.2) becomes the Lasso problem (1.1). Setting pλ(t) = at² + (1 − a)|t| with 0 ≤ a ≤ 1 results in the method of elastic net. With pλ(t) = λ|t|^q for some 0 < q ≤ 2, it is called bridge regression, which includes the ridge regression as a special case when q = 2. Some penal...
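For concreteness, the penalty families listed above can be written as plain functions. This is a minimal illustration, not code from the paper; the names and the parameters `lam`, `a`, and `q` are hypothetical.

```python
import numpy as np

def best_subset_penalty(t, lam):
    # (lambda/2) * I(t != 0): every nonzero coefficient pays the same price
    return (lam / 2) * (t != 0)

def lasso_penalty(t, lam):
    # lambda * |t|: the L1 penalty of the Lasso
    return lam * np.abs(t)

def elastic_net_penalty(t, lam, a):
    # lambda * (a*t^2 + (1-a)*|t|): convex mix of ridge and Lasso, 0 <= a <= 1
    return lam * (a * t**2 + (1 - a) * np.abs(t))

def bridge_penalty(t, lam, q):
    # lambda * |t|^q for 0 < q <= 2; q = 2 is ridge, q = 1 is the Lasso
    return lam * np.abs(t) ** q
```

Evaluating these on a coefficient vector shows how sharply each penalty treats values near zero, which is what drives sparsity in the Lasso and best subset cases.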
Computing the Least Median of Squares Estimator
In modern statistics, the robust estimation of the parameters of a regression hyperplane is a central problem, i.e., an estimate that is not, or only slightly, affected by outliers in the data. In this paper we consider the least median of squares (LMS) estimator. For n points in d dimensions we describe a randomized algorithm for LMS running in O(n^d) time and O(n) space, for d fixed, and ...
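As a toy illustration of the LMS objective (minimize the median of the squared residuals), a brute-force search over candidate lines through pairs of points can be written as follows. This is only a sketch of the objective in the simple-regression case; the randomized algorithm described in the paper is far more efficient, and `lms_line` is a hypothetical name.

```python
import itertools
import numpy as np

def lms_line(x, y):
    """Brute-force LMS fit for simple linear regression.

    Candidate fits are the lines through each pair of data points;
    the one minimizing the median squared residual is returned.
    """
    best, best_med = None, np.inf
    for i, j in itertools.combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue  # vertical line, skip
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        med = np.median((y - (intercept + slope * x)) ** 2)
        if med < best_med:
            best_med, best = med, (intercept, slope)
    return best, best_med

# One gross outlier barely moves the LMS fit: the median squared
# residual of the line y = 2x is still zero.
x = np.arange(5.0)
y = 2.0 * x
y[4] = 100.0
(intercept, slope), med = lms_line(x, y)
```

The example shows the point of the estimator: a single outlier that would ruin ordinary least squares leaves the median of the squared residuals untouched.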
The Multivariate Least Trimmed Squares Estimator
In this paper we introduce the least trimmed squares estimator for multivariate regression. We give three equivalent formulations of the estimator and obtain its breakdown point. A fast algorithm for its computation is proposed. We prove Fisher-consistency at the multivariate regression model with elliptically symmetric error distribution and derive the influence function. Simulations investigat...
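The trimming idea (keep only the h smallest squared residuals) is easiest to see in the one-dimensional location case, where the optimal h-subset is a contiguous block of the sorted sample and can be found by an exact scan. This is a minimal sketch under that 1-D assumption, not the multivariate algorithm of the paper; `lts_location` is a hypothetical name.

```python
import numpy as np

def lts_location(y, h):
    """Exact 1-D least trimmed squares location estimate.

    Minimizes the sum of the h smallest squared residuals; in 1-D the
    optimal h-subset is a contiguous block of the sorted data, so it
    suffices to scan all such blocks.
    """
    ys = np.sort(np.asarray(y, dtype=float))
    best_mu, best_obj = None, np.inf
    for start in range(len(ys) - h + 1):
        block = ys[start:start + h]
        mu = block.mean()
        obj = float(((block - mu) ** 2).sum())
        if obj < best_obj:
            best_obj, best_mu = obj, mu
    return best_mu, best_obj

# The outlier 100 is trimmed away entirely when h = 3.
mu, obj = lts_location([0.0, 1.0, 2.0, 100.0], h=3)
```

Because the outlying observation never enters the retained block, the estimate is driven only by the clean part of the sample, which is the source of the estimator's high breakdown point.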
Volterra filter identification using penalized least squares
Volterra filters have been applied to many nonlinear system identification problems. However, obtaining good filter estimates from short and/or noisy data records is a difficult task. We propose a penalized least squares estimation algorithm and derive appropriate penalizing functionals for Volterra filters. An example demonstrates that penalized least squares estimation can provide much more accurate fil...
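A generic ridge-type penalized least squares solve illustrates the kind of estimator involved. This is one simple instance of the approach, not the Volterra-specific penalty functionals derived in the paper; `penalized_ls` is a hypothetical name.

```python
import numpy as np

def penalized_ls(X, y, lam):
    """Solve min_b ||y - X b||^2 + lam * ||b||^2 via the normal equations.

    The penalty term lam * I regularizes the solve, which is what makes
    the estimate usable on short or noisy data records.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# With lam = 0 and a well-conditioned design, the ordinary LS solution
# is recovered exactly.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b_true = np.array([2.0, 3.0])
b_hat = penalized_ls(X, X @ b_true, lam=0.0)
```

For a Volterra filter the columns of X would be the (highly correlated) products of delayed inputs, which is exactly the setting where the penalty term earns its keep.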
Nonparametric regression estimation using penalized least squares
We present multivariate penalized least squares regression estimates. We use Vapnik-Chervonenkis theory and bounds on the covering numbers to analyze convergence of the estimates. We show strong consistency of the truncated versions of the estimates without any conditions on the underlying distribution.
Journal
Journal title: Statistica Neerlandica
Year: 2018
ISSN: 0039-0402
DOI: 10.1111/stan.12123